random segmentation


Random segmentation is a way to break big data into subsets small enough to process. Rather than using a systematic segmentation rule, the data is divided randomly into subsets of the desired size. As well as being simple to implement, it can have statistical advantages, since random assignment tends to spread different kinds of item roughly evenly across the data segments.
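As a rough illustration (not from the text, and using hypothetical names), the Python sketch below shuffles the items and slices them into segments of a chosen size, so each segment is a random subset of the original data.

```python
import random

def random_segments(items, segment_size, seed=None):
    """Split items into randomly assigned segments of roughly segment_size."""
    rng = random.Random(seed)
    shuffled = list(items)
    rng.shuffle(shuffled)  # random assignment instead of a systematic rule
    # Slice the shuffled list into consecutive chunks of the desired size
    return [shuffled[i:i + segment_size]
            for i in range(0, len(shuffled), segment_size)]

# Example: break 10 items into segments of 3 (the last segment may be smaller)
print(random_segments(range(10), 3, seed=42))
```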

Used on page 162